A GPU solver for symmetric positive-definite matrices vs. traditional codes

Authors
Abstract

Similar resources

"Compress and eliminate" solver for symmetric positive definite sparse matrices

We propose a new approximate factorization for solving linear systems with symmetric positive definite sparse matrices. In a nutshell, the algorithm applies block Gaussian elimination hierarchically and additionally compresses the fill-in. The systems whose fill-in admits efficient compression mostly arise from discretizations of partial differential equations. We show that the resulting fa...
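The hierarchical scheme above builds on repeated block elimination. As a rough NumPy sketch (not the authors' algorithm, and with no compression step), a single elimination step on an SPD matrix forms the Schur complement, which is exactly where the fill-in appears:

```python
import numpy as np

# Illustrative only: one step of block Gaussian elimination on a dense
# SPD matrix, eliminating the leading block and forming the Schur
# complement of the trailing block.
rng = np.random.default_rng(0)
M = rng.standard_normal((6, 6))
A = M @ M.T + 6 * np.eye(6)           # SPD test matrix

k = 3                                  # size of the leading (pivot) block
A11, A12 = A[:k, :k], A[:k, k:]
A21, A22 = A[k:, :k], A[k:, k:]

L11 = np.linalg.cholesky(A11)          # factor the pivot block: A11 = L11 L11^T
W = np.linalg.solve(L11, A12)          # W = L11^{-1} A12
S = A22 - W.T @ W                      # Schur complement (fill-in lives here)

# The Schur complement of an SPD matrix is again SPD, so elimination
# can recurse on S.
assert np.all(np.linalg.eigvalsh(S) > 0)
```

In the sparse setting S is generally much denser than A22; compressing that fill-in is the point of the method described in the abstract.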

DDtBe for Band Symmetric Positive Definite Matrices

We present a new parallel factorization for band symmetric positive definite (s.p.d.) matrices and show some of its applications. Let A be a band s.p.d. matrix of order n and half bandwidth m. We show how to factor A as A = DDᵗBe using approximately 4nm²/p parallel operations, where p is the number of processors. Having this factorization, we improve the time to solve Ax = b by a factor of m...

A Trace Bound for Positive Definite Connected Integer Symmetric Matrices

Let A be a connected integer symmetric matrix, i.e., A = (a_ij) ∈ M_n(ℤ) for some n, A = Aᵀ, and the underlying graph (vertices corresponding to rows, with vertex i joined to vertex j if a_ij ≠ 0) is connected. We show that if all the eigenvalues of A are strictly positive, then tr(A) ≥ 2n − 1. There are two striking corollaries. First, the analogue of the Schur–Siegel–Smyth trace problem is solve...
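The bound tr(A) ≥ 2n − 1 is easy to check numerically on a small instance. A minimal sketch (the example matrix below is chosen here for illustration, not taken from the paper):

```python
import numpy as np

# A connected integer symmetric matrix: tridiagonal, so the underlying
# graph is a path (connected), entries are integers, and A = A^T.
A = np.array([[2, 1, 0],
              [1, 2, 1],
              [0, 1, 2]])
n = A.shape[0]

# All eigenvalues strictly positive -> the theorem's hypothesis holds.
assert np.all(np.linalg.eigvalsh(A) > 0)

# The stated trace bound: tr(A) = 6 >= 2*3 - 1 = 5.
assert A.trace() >= 2 * n - 1
```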

Toward Accelerating the Matrix Inversion Computation of Symmetric Positive-Definite Matrices on Heterogeneous GPU-Based Systems

The goal of this paper is to implement an efficient matrix inversion of symmetric positive-definite matrices on heterogeneous GPU-based systems. The matrix inversion procedure can be split into three stages: computing the Cholesky factorization, inverting the Cholesky factor and calculating the product of the inverted Cholesky factor with its transpose to get the final inverted matrix. Using hi...
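The three stages described above can be sketched on the CPU with NumPy (a stand-in for the paper's GPU kernels; the particular NumPy calls are my choice, not the authors'):

```python
import numpy as np

rng = np.random.default_rng(1)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)              # SPD input matrix

L = np.linalg.cholesky(A)                # stage 1: Cholesky, A = L L^T
Linv = np.linalg.solve(L, np.eye(5))     # stage 2: invert the Cholesky factor
Ainv = Linv.T @ Linv                     # stage 3: A^{-1} = L^{-T} L^{-1}

assert np.allclose(Ainv @ A, np.eye(5))
```

On a GPU these three stages map naturally onto batched triangular kernels, which is why the split matters for performance.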

Supervised LogEuclidean Metric Learning for Symmetric Positive Definite Matrices

Metric learning has been shown to be highly effective to improve the performance of nearest neighbor classification. In this paper, we address the problem of metric learning for symmetric positive definite (SPD) matrices such as covariance matrices, which arise in many real-world applications. Naively using standard Mahalanobis metric learning methods under the Euclidean geometry for SPD matric...
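For SPD matrices the (unsupervised) Log-Euclidean distance is ‖log(A) − log(B)‖_F, computed through the matrix logarithm. A minimal sketch (the helper names `spd_log` and `log_euclidean_distance` are hypothetical, not from the paper):

```python
import numpy as np

def spd_log(A):
    """Matrix logarithm of an SPD matrix via its eigendecomposition."""
    w, V = np.linalg.eigh(A)             # real, positive eigenvalues for SPD A
    return (V * np.log(w)) @ V.T         # V diag(log w) V^T

def log_euclidean_distance(A, B):
    """Log-Euclidean distance d(A, B) = || log(A) - log(B) ||_F."""
    return np.linalg.norm(spd_log(A) - spd_log(B), 'fro')

A = np.diag([1.0, 4.0])
B = np.diag([1.0, 1.0])
# log(A) = diag(0, log 4) and log(B) = 0, so the distance is log 4.
assert np.isclose(log_euclidean_distance(A, B), np.log(4.0))
```

Supervised metric learning, as in the paper, then learns a transformation of this log-domain representation rather than using the plain Frobenius norm.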


Journal

Journal title: Computers & Mathematics with Applications

Year: 2019

ISSN: 0898-1221

DOI: 10.1016/j.camwa.2019.02.034